DTE AICCOMAS 2025

Domain Decomposition for Randomized Neural Networks

  • Heinlein, Alexander (Delft University of Technology)
  • Kapoor, Taniya (Delft University of Technology)
  • Masri, Rami (Brown University)
  • Zeinhofer, Marius (ETH Zürich)


Neural network architectures based on overlapping domain decomposition approaches have emerged as a powerful framework for enhancing the efficiency, scalability, and robustness of physics-informed neural networks (PINNs). In this work, we apply this approach to randomized neural networks (RaNNs) for solving partial differential equations (PDEs). A separate network is initialized independently on each subdomain, with parameters drawn from a uniform distribution, and the subdomain networks are combined via a partition of unity. Unlike classical PINNs, only the final layers of these networks are trained, which fundamentally changes the structure of the optimization problem: for linear PDEs, it reduces to a least-squares problem that can be solved using direct solvers for small systems or iterative solvers for larger ones. However, these least-squares problems are generally ill-conditioned, and iterative solvers converge slowly without appropriate preconditioning. To address this, we first apply a singular value decomposition (SVD) and remove components associated with small singular values, improving the conditioning of the system. Additionally, we employ a second type of overlapping domain decomposition in the form of additive and restricted additive Schwarz preconditioners for the least-squares problem, further enhancing solver efficiency. Numerical experiments demonstrate that this dual use of domain decomposition significantly reduces computational time while maintaining accuracy, particularly for multi-scale and time-dependent problems.
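
To make the setup concrete, here is a minimal 1D sketch of the architecture described above (illustrative only, not the authors' implementation): one randomized network per overlapping subdomain, with fixed hidden parameters drawn from a uniform distribution, blended by a piecewise-linear partition of unity. The subdomain split, network width, and sampling scale are assumptions for illustration; only the output coefficients c1, c2 would be trained.

```python
# Minimal 1D sketch (illustrative, not the authors' code) of a RaNN with
# an overlapping domain decomposition and a partition of unity.
import numpy as np

rng = np.random.default_rng(42)

def make_features(n_hidden=50, scale=2.0):
    """One randomized network: hidden weights/biases are fixed samples
    from a uniform distribution and are never trained."""
    W = rng.uniform(-scale, scale, size=n_hidden)
    b = rng.uniform(-scale, scale, size=n_hidden)
    return lambda x: np.tanh(np.outer(x, W) + b)  # shape (len(x), n_hidden)

def pou_weights(x):
    """Piecewise-linear partition of unity for the overlapping
    subdomains [0, 0.6] and [0.4, 1]; the weights sum to one."""
    w1 = np.clip((0.6 - x) / 0.2, 0.0, 1.0)
    return w1, 1.0 - w1

phi1, phi2 = make_features(), make_features()  # one network per subdomain
c1 = rng.standard_normal(50)  # output-layer coefficients: the only
c2 = rng.standard_normal(50)  # trainable parameters of each network

x = np.linspace(0.0, 1.0, 101)
w1, w2 = pou_weights(x)
u = w1 * (phi1(x) @ c1) + w2 * (phi2(x) @ c2)  # global blended solution
```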
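
The SVD-based filtering step can be sketched as follows, under the assumption that `A` is the (ill-conditioned) least-squares matrix, e.g., the collocation matrix of last-layer features; the tolerance `rtol` is a hypothetical parameter.

```python
# Minimal sketch of solving an ill-conditioned least-squares system
# A x = b via truncated SVD: singular values below a relative tolerance
# are discarded before solving, improving the conditioning.
import numpy as np

def truncated_svd_lstsq(A, b, rtol=1e-10):
    """Solve min ||A x - b||_2 after dropping small singular values."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    keep = s > rtol * s[0]  # retain only well-conditioned modes
    # Pseudo-inverse restricted to the retained modes
    return Vt[keep].T @ ((U[:, keep].T @ b) / s[keep])

# Hypothetical usage: a matrix with rapidly decaying singular values
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 100)) @ np.diag(np.logspace(0, -14, 100))
b = rng.standard_normal(200)
x = truncated_svd_lstsq(A, b)
print("residual:", np.linalg.norm(A @ x - b))
```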
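
Finally, a minimal sketch of one-level additive and restricted additive Schwarz (RAS) preconditioning; as an illustrative assumption, it is applied here to the normal equations N = AᵀA of the least-squares problem inside SciPy's GMRES, with hand-picked overlapping index sets over the coefficients.

```python
# Minimal sketch of additive / restricted additive Schwarz preconditioning
# for a least-squares problem, applied to the normal equations (assumed).
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def schwarz_preconditioner(N, subdomains, restricted=False):
    """subdomains: list of (overlap_idx, interior_idx) index arrays;
    interior_idx is used only in the restricted (RAS) variant."""
    blocks = [(np.asarray(ov), np.asarray(it),
               np.linalg.inv(N[np.ix_(ov, ov)]))  # pre-factorize local problems
              for ov, it in subdomains]

    def apply(v):
        z = np.zeros_like(v)
        for ov, it, Ninv in blocks:
            local = Ninv @ v[ov]  # solve the local (overlapping) problem
            if restricted:
                # RAS: prolongate only the non-overlapping interior part
                mask = np.isin(ov, it)
                z[ov[mask]] += local[mask]
            else:
                z[ov] += local  # additive Schwarz: sum over the overlap
        return z

    return LinearOperator(N.shape, matvec=apply)

# Hypothetical usage on a small dense system with two overlapping blocks
rng = np.random.default_rng(1)
A = rng.standard_normal((300, 120))
N = A.T @ A
b = A.T @ rng.standard_normal(300)
subs = [(np.arange(0, 70), np.arange(0, 60)),
        (np.arange(50, 120), np.arange(60, 120))]
M = schwarz_preconditioner(N, subs, restricted=True)
# GMRES rather than CG, since the RAS preconditioner is nonsymmetric
x, info = gmres(N, b, M=M)
print("converged:", info == 0)
```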